How to Measure Automation Success for the Enterprise

#artificialintelligence

Automation has become widely recognized for saving employees time and effort by carrying out high-volume, repetitive, and typically error-prone tasks. By utilizing robotic process automation (RPA), enterprises can more easily manage their manual, time-intensive tasks while also boosting accuracy, timeliness, and compliance. RPA has also been leveraged for cost avoidance. During the COVID-19 pandemic, companies across a variety of industries used automation to ensure business continuity and optimize costs. For example, airline companies developed automated processes to refund customers and reschedule flights.


Pycaret: A Faster Way to Build Machine Learning Models

#artificialintelligence

Building a machine learning model requires a series of steps, from data preparation, data cleaning, and feature engineering through model building and model deployment. It can therefore take a data scientist a lot of time to create a solution that solves a business problem. To help speed up the process, you can use Pycaret, an open-source, low-code library in Python that aims to automate the development of machine learning models. Pycaret can help you perform the entire end-to-end ML process faster, with only a few lines of code.
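To make the "compare many models automatically" idea concrete, here is a minimal sketch of the cross-validated comparison loop that a library like Pycaret collapses into a single call. It uses scikit-learn as an assumed stand-in, not Pycaret's own API; the candidate models and dataset are illustrative choices.

```python
# Sketch of automated model comparison (the loop Pycaret's
# compare_models() automates), shown manually with scikit-learn.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Candidate models to rank against each other.
candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "knn": KNeighborsClassifier(),
}

# Score every candidate with 5-fold cross-validation and keep the best.
scores = {
    name: cross_val_score(model, X, y, cv=5).mean()
    for name, model in candidates.items()
}
best_name = max(scores, key=scores.get)
print(best_name, round(scores[best_name], 3))
```

In practice an AutoML library also handles preprocessing, imputation, and encoding before this loop runs; the sketch only shows the ranking step.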


Intelligent automation depends on these 4 cornerstones

#artificialintelligence

There has been more than a modicum of buzz around what IDC calls intelligent process automation and what Gartner calls hyperautomation. In both cases, these terms refer to the integrated deployment of digital technologies such as robotic process automation (RPA), intelligent business process management suites (iBPMS), artificial intelligence, process mining, etc. Integrating digital technologies is far from a new concept. MIT and Deloitte advocated this approach back in the day when everyone was focused on social, mobile, analytics, and cloud (SMAC). Digital transformation is a complex undertaking. Frankly, the track record on digital transformation success is dismal, with as few as 30% of such initiatives succeeding. The integration of digital technologies may just be the lever that leads to more success.


Introducing Our Low Code Machine Learning Platform

#artificialintelligence

We are very excited to release the free tier of dunnhumby Model Lab as part of our partnership with Microsoft. We make it easy to connect your data, clean your data, and run your machine learning pipeline within minutes. You can then take that output and copy it right into a notebook for further refinement if needed. You can create new projects, reference datasets, and create multiple experiments in just a few clicks! You can also follow the progress of your machine learning experiments as they update in real time.


Continuous delivery for machine learning - DevOps Conference

#artificialintelligence

As organizations move to become more "data-driven" or "AI-driven", it's increasingly important to incorporate data science and data engineering approaches into the software development process to avoid silos that hinder efficient collaboration and alignment. However, this integration also brings new challenges when compared to traditional software development. Not only do we have to manage the software code artifacts, but also the data sets, the machine learning models, and the parameters and hyperparameters used by such models. All these artifacts have to be managed, versioned, and promoted through different stages until they're deployed to production. It's harder to achieve versioning, quality control, reliability, repeatability, and auditability in that process.
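One way to picture "versioning all the artifacts together" is to derive a single version id from the code, data, and hyperparameters by content hash, so any change to any artifact produces a new version. This is a minimal illustrative sketch of the idea, not the API of any specific tool; dedicated tools such as DVC or MLflow handle this far more robustly.

```python
# Sketch: derive one reproducible version id from all ML artifacts,
# so changing code, data, OR hyperparameters yields a new version.
import hashlib
import json

def artifact_version(code: bytes, data: bytes, params: dict) -> str:
    """Hash code, data, and hyperparameters into a short version id."""
    h = hashlib.sha256()
    h.update(code)
    h.update(data)
    # Canonical JSON so identical params always hash identically.
    h.update(json.dumps(params, sort_keys=True).encode())
    return h.hexdigest()[:12]

v1 = artifact_version(b"train.py v1", b"dataset-2021", {"lr": 0.01, "depth": 6})
v2 = artifact_version(b"train.py v1", b"dataset-2021", {"lr": 0.02, "depth": 6})
print(v1, v2)  # changing a hyperparameter alone changes the version
```

The same id can then tag the trained model as it is promoted through stages, giving the repeatability and auditability the article calls for.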


What is AutoML (Automated Machine Learning)? AISOMA AG Frankfurt

#artificialintelligence

AutoML enjoys steadily increasing popularity (see Forbes), driven not least by its numerous successes in practical analyses. In a world in which more and more devices produce data and are networked with each other, the data "produced" grows disproportionately. AutoML is therefore urgently needed to extract knowledge from this rapidly growing data in a timely manner. We expect AutoML to become even more critical in the coming years, and its analysis methods to deliver even more precise results, faster.


New AI platform tackles ten steps of AutoML - SD Times

#artificialintelligence

The combination of Aible Advanced for data scientists and Aible for business people allows experts to scale themselves by enforcing best practices while easily soliciting business user input to maximize business impact. Aible Advanced users can solicit feedback from Aible users, or even delegate specific steps of the end-to-end process to specific business people using Aible.


Robotic Process Automation (RPA) vs. AI, explained

#artificialintelligence

The expanding universe of artificial intelligence includes many terms and technologies. That naturally leads to overlap and confusion. AI and machine learning are mentioned together so often that some people – non-technical folks especially – might think they're one and the same. They're related but not actually interchangeable terms: Machine learning is a subset, or a specific discipline, of AI. Start adding other terms and technologies into the mix – deep learning is yet another subset of machine learning, for instance – and the opportunities abound for further misconceptions. Deciphering the differences between terms and technologies takes a twist with robotic process automation (RPA) and AI.


Automated Tasks and #ArtificialIntelligence @CloudExpo #AI #ML #BigData

#artificialintelligence

The next BriefingsDirect technology innovation thought leadership discussion explores how rapid advances in artificial intelligence (AI) and machine learning are poised to reshape procurement -- like fast-forwarding to a once-fanciful vision of the future. Whereas George Jetson of the 1960s cartoon portrayed a world of household robots, flying cars, and push-button corporate jobs, the 2017 procurement landscape has its own impressive retinue of decision bots, automated processes, and data-driven insights. We won't need to wait long for this vision of futuristic business to arrive. As we enter 2017, applied intelligence derived from entirely new data analysis benefits has redefined productivity and provided business leaders with unprecedented tools for managing procurement, supply chains, and continuity risks. To learn more about the future of predictive -- and even proactive -- procurement technologies, please welcome Chris Haydon, Chief Strategy Officer at SAP Ariba. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions. Gardner: It seems like only yesterday that we were content to gain a common view of the customer or develop an end-to-end bead on a single business process.


Banking on Bots: The Move towards Digital Labor in Financial Services

#artificialintelligence

Over the last two decades, banks and financial institutions have achieved significant efficiencies through outsourcing, offshoring, and labor arbitrage. Those levers can only go so far, however, when it comes to further cutting costs and simultaneously growing revenues -- particularly at a time when disintermediation is increasing and budgets are diminishing. To address this, enter the "bots": Financial services institutions including retail banks, investment banks, custodians, commercial banks, and insurers are turning to digital labor, which represents a new wave of technologies such as robotic process automation (RPA) and cognitive automation. "Bots" are not the electromechanical robots with arms and legs one may have seen in sci-fi movies. Instead, the term refers to software configured to interact with computers and applications just like a human would, performing high-volume, repetitive processes and tasks.